# Multi-round Training
## T5 Small Finetuned V2 Hausa To Chinese
- License: Apache-2.0
- Tags: Machine Translation, Transformers
- Author: Kumshe (15 · 1)

A Hausa-to-Chinese translation model fine-tuned from T5-small, achieving a BLEU score of 30.0183 on the evaluation set.
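The BLEU score quoted above is the geometric mean of modified n-gram precisions multiplied by a brevity penalty. A minimal corpus-BLEU sketch in pure Python (standard implementations such as sacrebleu additionally apply tokenization and smoothing, so their numbers will differ slightly):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def corpus_bleu(references, hypotheses, max_n=4):
    """Corpus-level BLEU: geometric mean of modified n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    clipped = Counter()   # clipped n-gram matches per order
    totals = Counter()    # hypothesis n-gram counts per order
    ref_len = hyp_len = 0
    for ref, hyp in zip(references, hypotheses):
        ref_tok, hyp_tok = ref.split(), hyp.split()
        ref_len += len(ref_tok)
        hyp_len += len(hyp_tok)
        for n in range(1, max_n + 1):
            ref_counts = Counter(ngrams(ref_tok, n))
            hyp_counts = Counter(ngrams(hyp_tok, n))
            totals[n] += sum(hyp_counts.values())
            # Clip each hypothesis n-gram count by its reference count.
            clipped[n] += sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
    if min(clipped[n] for n in range(1, max_n + 1)) == 0:
        return 0.0  # some n-gram order has no matches at all
    log_prec = sum(math.log(clipped[n] / totals[n]) for n in range(1, max_n + 1)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return bp * math.exp(log_prec)
```

A perfect hypothesis scores 1.0 on this 0-to-1 scale; model cards such as the one above report the same quantity multiplied by 100.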
## Marian Finetuned Iitb En To Hi
- Tags: Machine Translation, Transformers
- Author: Shiva26 (53 · 1)

An English-to-Hindi neural machine translation model based on the Marian framework, fine-tuned on the IIT Bombay English-Hindi corpus.
## Finetuned Wav2vec2.0 Base On IEMOCAP 2
- License: Apache-2.0
- Tags: Audio Classification, Transformers
- Author: minoosh (32 · 2)

A speech emotion recognition model fine-tuned from facebook/wav2vec2-base on the IEMOCAP dataset, achieving 73.9% accuracy on the evaluation set.
## Nick Asr LID
- Tags: Speech Recognition, Transformers
- Author: ntoldalagi (28 · 0)

An automatic speech recognition model trained on an unknown dataset, supporting language identification.
## Wav2vec2 Base Checkpoint 9
- License: Apache-2.0
- Tags: Speech Recognition, Transformers
- Author: jiobiala24 (16 · 0)

A speech recognition model fine-tuned from wav2vec2-base-checkpoint-8 on the common_voice dataset, achieving a word error rate of 0.3258 on the evaluation set.
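The word error rate reported above is the word-level edit distance (substitutions, insertions, and deletions) between hypothesis and reference transcripts, divided by the number of reference words. A minimal sketch of the standard dynamic-programming computation:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance divided
    by the reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)
```

A WER of 0.3258 thus means roughly one word in three is wrong relative to the reference; lower is better, and values above 1.0 are possible when the hypothesis contains many insertions.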
## Finetuned
- Tags: Large Language Model, Transformers
- Author: vppvgit (18 · 0)

BibliBERT, a fine-tuned version of the Italian pre-trained model dbmdz/bert-base-italian-xxl-cased, used primarily for masked language modeling.
## Bert Small Mnli
- Tags: Large Language Model
- Author: prajjwal1 (29 · 0)

A PyTorch model obtained by converting TensorFlow checkpoints from the official Google BERT repository, originating from the paper 'Well-Read Students Learn Better: On the Importance of Pre-training Compact Models', and trained on the MNLI dataset.